A Additional definitions

Neural Information Processing Systems

We provide the definitions of important terms used throughout the paper. In the following results, we show that there exist appropriate constants such that the prior distribution satisfies Assumption 2.3 when the demand distribution is exponential (note that Lemma B.1 implies this), and likewise when the demand distribution is a multivariate Gaussian with unknown parameters. The proof is a direct consequence of Theorem 3.2, Lemmas B.6, B.7, B.8, B.9, and Proposition 3.2. The prior induced by Assumption 2.2 is a direct consequence of [Theorem 6.19]; Assumptions 2.4 and 2.5 are straightforward to satisfy given the form of the model risk function (Lemma B.13). Using the result above together with Proposition 3.2 implies that the RSVB posterior converges at the stated rate.

C.1 Alternative derivation of LCVB

We present the alternative derivation of LCVB. We prove our main result after a series of important lemmas.
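As an illustrative aside (not taken from the paper): for an exponential demand distribution, a conjugate Gamma prior on the rate yields a closed-form posterior, which gives one simple way to check numerically that the posterior concentrates as the sample size grows. The Gamma(a0, b0) prior and the specific rate below are our own assumptions for illustration, not the paper's constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# True (unknown) rate of the exponential demand distribution -- illustrative value.
true_rate = 2.0

# Gamma(a0, b0) prior on the rate, conjugate to the exponential likelihood.
a0, b0 = 1.0, 1.0

def posterior_params(data, a0, b0):
    """Closed-form Gamma posterior for an exponential likelihood:
    Gamma(a0 + n, b0 + sum(x))."""
    n = len(data)
    return a0 + n, b0 + data.sum()

for n in (10, 100, 10_000):
    data = rng.exponential(scale=1.0 / true_rate, size=n)
    a_n, b_n = posterior_params(data, a0, b0)
    post_mean = a_n / b_n      # posterior mean of the rate
    post_var = a_n / b_n**2    # posterior variance shrinks like O(1/n)
    print(f"n={n:6d}  posterior mean={post_mean:.3f}  posterior variance={post_var:.2e}")
```

As n grows, the posterior mean approaches the true rate and the posterior variance shrinks, which is the elementary analogue of the concentration behavior the appendix establishes under Assumption 2.3.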




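The paper's own LCVB derivation is not recoverable from these fragments. As a generic illustration of the mean-field variational-Bayes machinery that such derivations build on, here is a coordinate-ascent (CAVI) sketch for the standard textbook model of a Gaussian with unknown mean and precision; the model, priors, and all names below are our assumptions, not the paper's method.

```python
import numpy as np

def cavi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Mean-field VB for x_i ~ N(mu, 1/tau) with priors
    mu ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).
    Returns parameters of q(mu) = N(mu_N, 1/lam_N) and q(tau) = Gamma(a_N, b_N)."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()

    # These two updates do not change across iterations in this conjugate model.
    mu_N = (lam0 * mu0 + n * xbar) / (lam0 + n)
    a_N = a0 + (n + 1) / 2.0

    E_tau = a0 / b0  # initial guess for E_q[tau]
    for _ in range(iters):
        lam_N = (lam0 + n) * E_tau
        # E_q[(x_i - mu)^2] = (x_i - mu_N)^2 + 1/lam_N, and similarly for mu0.
        E_sq = np.sum((x - mu_N) ** 2) + n / lam_N
        b_N = b0 + 0.5 * (E_sq + lam0 * ((mu0 - mu_N) ** 2 + 1.0 / lam_N))
        E_tau = a_N / b_N  # iterate the coupled updates to a fixed point
    lam_N = (lam0 + n) * E_tau
    return mu_N, lam_N, a_N, b_N
```

With enough data, mu_N approaches the true mean and a_N / b_N (the variational posterior mean of the precision) approaches the true precision; LCVB-style analyses study how fast such variational posteriors converge.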

Author Feedback

The first term $\|x_t - x_\alpha\|^2$ can simply be merged into the corresponding equation. The first term is the consensus error and will be merged into $T_1$ in that equation. Response: We have conducted more experiments on the ImageNet dataset, which is known to be a complicated dataset. As the middle figure demonstrates, for binary classification our method significantly improves upon the benchmarks on this dataset as well. We also carried out experiments on a deeper neural network with 4 hidden layers, and our method provides significant speedups over the benchmarks (bottom figure).